EM Algorithm: A tool for analysis in the presence of missing values

 

The expectation-maximization (EM) algorithm is a broadly applicable approach to the iterative computation of maximum likelihood (ML) estimates, useful in a variety of incomplete-data problems where algorithms such as the Newton-Raphson method turn out to be more complicated. Each iteration of the algorithm consists of two steps: an expectation step (E-step) and a maximization step (M-step). The name is due to Dempster, Laird and Rubin (1977). For a detailed study of the EM algorithm, see McLachlan and Krishnan (2008).
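In symbols, with observed data $y$, complete data $x$, and parameter vector $\Psi$, the $(k+1)$th iteration can be sketched as follows (standard notation; $L_c$ denotes the complete-data likelihood):

```latex
% E-step: average the complete-data log likelihood over the conditional
% distribution of the missing data, given y and the current estimate
Q(\Psi; \Psi^{(k)}) = E_{\Psi^{(k)}}\!\left[ \log L_c(\Psi; x) \mid y \right]

% M-step: choose the next estimate to maximize Q
\Psi^{(k+1)} = \arg\max_{\Psi} \, Q(\Psi; \Psi^{(k)})
```

The iterations are stopped when successive estimates (or log-likelihood values) differ by less than a chosen tolerance.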

 

# R program to find MLE using EM Algorithm: Multinomial Example
# Example 1.1 page 8 McLachlan and Krishnan (2008) 

x1 = 125
x2 = 18
x3 = 20
x4 = 34
n = x1 + x2 + x3 + x4
p = (x1 - x2 - x3 + x4)/n       # Initial estimate (moment estimator)
p                               # Print the initial estimate
p0 = p
x11 = (0.5*x1)/(0.5 + (p0/4))   # E-step in the first iteration
x12 = x1 - x11
p1 = (x12 + x4)/(n - x11)       # M-step in the first iteration
r = 0                           # Counts the number of iterations
e = 0.00001                     # Convergence tolerance
while (abs(p1 - p0) > e)
{
  p0 = p1
  x11 = (0.5*x1)/(0.5 + (p0/4))  # E-step
  x12 = x1 - x11
  p1 = (x12 + x4)/(n - x11)      # M-step
  r = r + 1
}
p1        # This will print the MLE

r + 1     # This will give the number of iterations
          # Note that the number of iterations will change with the initial value p0
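For reference, the E- and M-step formulas used in the program come from the genetic linkage model of Example 1.1: the observed counts $(x_1, x_2, x_3, x_4)$ are multinomial with cell probabilities $(\tfrac{1}{2} + \tfrac{p}{4},\ \tfrac{1-p}{4},\ \tfrac{1-p}{4},\ \tfrac{p}{4})$, and the first cell is split into two latent subcells with probabilities $\tfrac{1}{2}$ and $\tfrac{p}{4}$. A sketch of the two steps:

```latex
% E-step: expected split of x_1 given the current estimate p^{(k)}
x_{11} = \frac{\tfrac{1}{2}\, x_1}{\tfrac{1}{2} + \tfrac{p^{(k)}}{4}},
\qquad x_{12} = x_1 - x_{11}

% M-step: complete-data MLE of p
p^{(k+1)} = \frac{x_{12} + x_4}{x_{12} + x_2 + x_3 + x_4}
          = \frac{x_{12} + x_4}{n - x_{11}}
```

The initial value $p^{(0)} = (x_1 - x_2 - x_3 + x_4)/n$ is a moment estimator, since $E(X_1 - X_2 - X_3 + X_4)/n = p$ under this model.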

 

References

1. Dempster AP, Laird NM and Rubin DB (1977). Maximum likelihood from incomplete data via the EM algorithm (with discussion). J Roy Stat Soc B 39:1–38.

2. McLachlan GJ and Krishnan T (2008). The EM Algorithm and Extensions, 2nd Edition. Wiley Series in Probability and Statistics.

 
